In applied mathematics, Arnold diffusion is the phenomenon of instability of nearly integrable Hamiltonian systems. The phenomenon is named after Vladimir Arnold, who was the first to publish a result in the field, in 1964.[1] More precisely, Arnold diffusion refers to results asserting the existence of solutions to nearly integrable Hamiltonian systems that exhibit a significant change in the action variables.
For integrable systems, one has the conservation of the action variables. According to the KAM theorem, if we perturb an integrable system slightly, then many, though certainly not all, of the solutions of the perturbed system stay close, for all time, to solutions of the unperturbed system. In particular, since the action variables were originally conserved, the KAM theorem tells us that there is only a small change in action for many solutions of the perturbed system.
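The mechanism behind this conservation can be made explicit in action–angle coordinates. The following sketch uses standard notation ($H_0$, $H_1$, $I$, $\theta$) not taken from this article:

$$H(I,\theta,t) = H_0(I) + \varepsilon H_1(I,\theta,t), \qquad \dot I = -\frac{\partial H}{\partial \theta} = -\varepsilon\,\frac{\partial H_1}{\partial \theta}, \qquad \dot\theta = \frac{\partial H}{\partial I}.$$

When $\varepsilon = 0$, Hamilton's equations give $\dot I = 0$, so the actions are conserved exactly. For small $\varepsilon > 0$ the actions can change only at a rate of order $\varepsilon$, and the KAM theorem guarantees that for a large measure of initial conditions this change stays small for all time. Arnold diffusion concerns the exceptional solutions that escape this bound.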
However, as first noted in Arnold's paper,[1] there are nearly integrable systems for which there exist solutions that exhibit arbitrarily large growth in the action variables. More precisely, Arnold considered the example of a nearly integrable Hamiltonian system with Hamiltonian

$$H(I_1, I_2, \varphi_1, \varphi_2, t) = \frac{1}{2}\left(I_1^2 + I_2^2\right) + \varepsilon\,(\cos\varphi_1 - 1)\bigl(1 + \mu\,(\sin\varphi_2 + \cos t)\bigr).$$
He showed that for this system, for any fixed $\varepsilon > 0$ and any choice of $A, B$ where $0 < A < B$, there exists a $\mu_0 > 0$ such that for all $0 < \mu < \mu_0$ there is a solution to the system for which

$$I_2(0) \le A \quad \text{and} \quad I_2(T) \ge B$$

for some time $T > 0$.
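Hamilton's equations for Arnold's example can be integrated numerically. The sketch below derives the equations of motion from the Hamiltonian above and follows the action $I_2$ with a standard RK4 integrator; the parameter values, initial condition, and time span are illustrative choices, not values from Arnold's paper. (Over such short times and for small $\mu$, $I_2$ barely moves; the diffusing orbits Arnold constructed drift over far longer, exponentially large time scales.)

```python
import numpy as np

# Illustrative parameters (not from Arnold's paper):
EPS, MU = 0.01, 0.001

def rhs(t, y):
    """Hamilton's equations for
    H = (I1^2 + I2^2)/2 + eps*(cos(phi1) - 1)*(1 + mu*(sin(phi2) + cos(t)))."""
    I1, I2, phi1, phi2 = y
    pert = 1.0 + MU * (np.sin(phi2) + np.cos(t))
    return np.array([
        EPS * np.sin(phi1) * pert,                        # dI1/dt   = -dH/dphi1
        -EPS * MU * (np.cos(phi1) - 1.0) * np.cos(phi2),  # dI2/dt   = -dH/dphi2
        I1,                                               # dphi1/dt =  dH/dI1
        I2,                                               # dphi2/dt =  dH/dI2
    ])

def rk4_step(y, t, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = rhs(t, y)
    k2 = rhs(t + dt / 2, y + dt / 2 * k1)
    k3 = rhs(t + dt / 2, y + dt / 2 * k2)
    k4 = rhs(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

y = np.array([0.1, 0.2, 0.0, 0.0])  # initial (I1, I2, phi1, phi2)
I2_start = y[1]
t, dt = 0.0, 0.01
for _ in range(100_000):            # integrate up to t = 1000
    y = rk4_step(y, t, dt)
    t += dt

drift = abs(y[1] - I2_start)
print(f"|I2(1000) - I2(0)| = {drift:.3e}")
```

Since $|\dot I_2| \le 2\varepsilon\mu$, the drift over this time span is bounded by $2\varepsilon\mu T = 0.02$, consistent with the near-conservation of the action that the KAM theorem predicts for typical orbits.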